Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities

Authors

  • Frank Nielsen
  • Ke Sun
Abstract

Information-theoretic measures such as the entropy, the cross-entropy and the Kullback-Leibler divergence between two mixture models are core primitives in many signal processing tasks. Since the Kullback-Leibler divergence between mixtures provably does not admit a closed-form formula, in practice it is either estimated using costly Monte Carlo stochastic integration, approximated, or bounded using various techniques. We present a fast and generic method that algorithmically builds closed-form lower and upper bounds on the entropy, the cross-entropy and the Kullback-Leibler divergence of mixtures. We illustrate this versatile method by reporting on our experiments in approximating the Kullback-Leibler divergence between univariate exponential mixtures, Gaussian mixtures, Rayleigh mixtures, and Gamma mixtures.
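
As a quick illustration of the core inequality, the mixture log-density $\log m(x) = \log \sum_{i=1}^k w_i p_i(x)$ is a log-sum-exp of the component terms, so it is sandwiched pointwise as $\max_i \log(w_i p_i(x)) \le \log m(x) \le \max_i \log(w_i p_i(x)) + \log k$. The Python sketch below turns this coarse sandwich into lower and upper bounds on the Kullback-Leibler divergence between two Gaussian mixtures; the mixture parameters are hypothetical, and grid quadrature stands in for the paper's closed-form piecewise integration.

    import numpy as np
    from scipy.stats import norm

    # Hypothetical two-component univariate Gaussian mixtures
    # (weights, means, standard deviations); not taken from the paper.
    w1, mu1, s1 = np.array([0.4, 0.6]), np.array([-1.0, 2.0]), np.array([0.8, 1.5])
    w2, mu2, s2 = np.array([0.5, 0.5]), np.array([0.0, 1.0]), np.array([1.0, 0.5])

    def log_terms(x, w, mu, s):
        # log(w_i p_i(x)) for every component i at every grid point x.
        return np.log(w)[None, :] + norm.logpdf(x[:, None], mu[None, :], s[None, :])

    def lse_sandwich(x, w, mu, s):
        # max_i log(w_i p_i(x)) <= log m(x) <= max_i log(w_i p_i(x)) + log k
        t = log_terms(x, w, mu, s).max(axis=1)
        return t, t + np.log(len(w))

    x = np.linspace(-10.0, 12.0, 20001)    # truncated support, uniform grid
    dx = x[1] - x[0]
    log_m = np.logaddexp.reduce(log_terms(x, w1, mu1, s1), axis=1)
    m = np.exp(log_m)                      # first mixture density (exact)
    lo, up = lse_sandwich(x, w2, mu2, s2)  # pointwise bounds on log m'(x)

    # KL(m : m') = integral of m (log m - log m'); an upper bound on log m'
    # gives a lower bound on the divergence, and vice versa.
    kl_lower = float(np.sum(m * (log_m - up)) * dx)
    kl_upper = float(np.sum(m * (log_m - lo)) * dx)
    print(kl_lower, kl_upper)

The paper's piecewise refinement instead splits the support into intervals on which the dominating component term is fixed, so that each envelope piece can be integrated in closed form and the $\log k$ gap is tightened.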


Similar Resources

Weak log-majorization inequalities of singular values between normal matrices and their absolute values

This paper presents two main results: that the singular values of the Hadamard product of normal matrices $A_i$ are weakly log-majorized by the singular values of the Hadamard product of $|A_i|$, and that the singular values of the sum of normal matrices $A_i$ are weakly log-majorized by the singular values of the sum of $|A_i|$. Some applications of these inequalities are also given. In addi...
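
The first of these claims is easy to sanity-check numerically. Below is a minimal sketch under an assumed test harness: normal matrices are generated as $U \mathrm{diag}(c) U^*$ with $U$ unitary, $|A| = (A^* A)^{1/2}$ is the matrix absolute value, and weak log-majorization is checked through prefix products of decreasingly sorted singular values.

    import numpy as np

    def random_normal_matrix(n, rng):
        # A = U diag(c) U^* with U unitary: a generic normal matrix.
        z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
        u, _ = np.linalg.qr(z)
        c = rng.standard_normal(n) + 1j * rng.standard_normal(n)
        return u @ np.diag(c) @ u.conj().T

    def abs_matrix(a):
        # |A| = (A^* A)^{1/2}, here via the SVD A = U S V^*: |A| = V S V^*.
        _, s, vh = np.linalg.svd(a)
        return vh.conj().T @ np.diag(s) @ vh

    def weakly_log_majorized(x, y, tol=1e-9):
        # x is weakly log-majorized by y iff every prefix product of the
        # decreasingly sorted x is at most the corresponding product for y.
        x, y = np.sort(x)[::-1], np.sort(y)[::-1]
        return bool(np.all(np.cumprod(x) <= np.cumprod(y) * (1 + tol)))

    rng = np.random.default_rng(0)
    a, b = random_normal_matrix(4, rng), random_normal_matrix(4, rng)
    sv = np.linalg.svd(a * b, compute_uv=False)               # Hadamard product
    sv_abs = np.linalg.svd(abs_matrix(a) * abs_matrix(b), compute_uv=False)
    print(weakly_log_majorized(sv, sv_abs))                   # expected: True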


Some new bounds on the general sum-connectivity index

Let $G=(V,E)$ be a simple connected graph with $n$ vertices, $m$ edges and vertex degree sequence $d_1 \ge d_2 \ge \cdots \ge d_n > 0$, $d_i = d(v_i)$, where $v_i \in V$. With $i \sim j$ we denote adjacency of vertices $v_i$ and $v_j$. The general sum-connectivity index of a graph is defined as $\chi_{\alpha}(G) = \sum_{i \sim j} (d_i + d_j)^{\alpha}$, where $\alpha$ is an arbitrary real...
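
Evaluating $\chi_{\alpha}(G)$ directly from this definition is straightforward; here is a minimal sketch on a hypothetical example graph (the path $P_4$):

    def general_sum_connectivity(edges, degrees, alpha):
        # chi_alpha(G) = sum over edges {i, j} of (d_i + d_j)^alpha
        return sum((degrees[i] + degrees[j]) ** alpha for i, j in edges)

    # Hypothetical test case: the path P4 on vertices 0-1-2-3.
    edges = [(0, 1), (1, 2), (2, 3)]
    degrees = {0: 1, 1: 2, 2: 2, 3: 1}
    print(general_sum_connectivity(edges, degrees, alpha=-0.5))  # ~1.6547

For $\alpha = -1/2$ this recovers the original sum-connectivity index, and for $\alpha = 1$ the sum reduces to the first Zagreb index $\sum_v d_v^2$, since each degree $d_v$ is counted once per incident edge.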


Measure Concentration, Transportation Cost, and Functional Inequalities

In these lectures, we present a triple description of the concentration of measure phenomenon: geometric (through Brunn-Minkowski inequalities), measure-theoretic (through transportation cost inequalities) and functional (through logarithmic Sobolev inequalities), and investigate the relationships between these various viewpoints. Special emphasis is put on optimal mass transportation and the ...


Information theoretic inequalities

The role of inequalities in information theory is reviewed, and the relationship of these inequalities to inequalities in other branches of mathematics is developed. Index Terms: information inequalities, entropy power, Fisher information, uncertainty principles. I. PREFACE: INEQUALITIES IN INFORMATION THEORY. Inequalities in information theory have been driven by a desire to solve communication th...


Concentration around the Mean for Maxima of Empirical Processes

In this paper we are interested in concentration inequalities for $Z = \sup\{S_n(s) : s \in S\}$. Let us now recall the main results in this direction. Starting from concentration inequalities for product measures, Talagrand (1996) obtained Bennett-type upper bounds on the Laplace transform of $Z$. More precisely, he proved log E exp(tZ) ≤ t E(Z) + V ab(e − ...
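
Since the excerpt cuts off mid-formula, here is the classical Bennett-type shape of such Laplace-transform bounds, for orientation only; this is the standard form built from $\varphi(u) = e^u - u - 1$ (as in Bousquet's refinement of Talagrand's inequality), not necessarily the exact constants of the paper:

    \[
      \log \mathbb{E}\, e^{t (Z - \mathbb{E} Z)} \;\le\; \frac{v}{b^{2}} \left( e^{b t} - b t - 1 \right), \qquad t \ge 0,
    \]

where $b$ uniformly bounds the summands and $v$ is a variance proxy such as $\sigma^2 + 2 b\, \mathbb{E}(Z)$.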



Journal title:
  • Entropy

Volume 18, Issue -

Pages -

Publication date: 2016